AdaBoost and neural networks

Authors

  • Terry Windeatt
  • Reza Ghaderi
Abstract

AdaBoost, a recent version of Boosting, is known to improve the performance of decision trees in many classification problems, but in some cases it does not do as well as expected. There are also a few reports of its application to more complex classifiers such as neural networks. In this paper we decompose and modify this algorithm for use with RBF NNs, our methodology being based on the technique of combining multiple classifiers.
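
To make the setting concrete, the sketch below shows a plain discrete AdaBoost loop with small RBF networks as the base classifiers, written in Python with NumPy. It is only an illustration of the general AdaBoost-plus-RBF-NN combination, not the specific decomposition and modification proposed in the paper; the random choice of centres, the Gaussian width gamma, and the ridge-regularised weighted least-squares output layer are illustrative assumptions.

# Minimal sketch, assuming binary labels in {-1, +1}; not the authors' exact method.
import numpy as np

def rbf_features(X, centres, gamma):
    # Gaussian RBF activations for every (sample, centre) pair.
    d2 = ((X[:, None, :] - centres[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

def fit_rbf_net(X, y, w, n_centres=10, gamma=1.0, ridge=1e-3, seed=0):
    # Weighted RBF network: random centres, ridge-regularised linear output layer.
    rng = np.random.default_rng(seed)
    centres = X[rng.choice(len(X), size=min(n_centres, len(X)), replace=False)]
    Phi = rbf_features(X, centres, gamma)
    A = Phi.T @ (w[:, None] * Phi) + ridge * np.eye(Phi.shape[1])
    beta = np.linalg.solve(A, Phi.T @ (w * y))
    return centres, gamma, beta

def predict_rbf_net(model, X):
    centres, gamma, beta = model
    return np.where(rbf_features(X, centres, gamma) @ beta >= 0.0, 1.0, -1.0)

def adaboost_rbf(X, y, n_rounds=10):
    # Discrete AdaBoost: reweight samples each round, combine by weighted vote.
    w = np.full(len(X), 1.0 / len(X))
    ensemble = []
    for t in range(n_rounds):
        model = fit_rbf_net(X, y, w, seed=t)
        pred = predict_rbf_net(model, X)
        err = w[pred != y].sum()
        if err <= 0.0 or err >= 0.5:          # no usable weak hypothesis this round
            break
        alpha = 0.5 * np.log((1.0 - err) / err)
        ensemble.append((alpha, model))
        w *= np.exp(-alpha * y * pred)        # boost the weight of misclassified points
        w /= w.sum()
    return ensemble

def predict_ensemble(ensemble, X):
    score = sum(alpha * predict_rbf_net(m, X) for alpha, m in ensemble)
    return np.where(score >= 0.0, 1.0, -1.0)

For labels coded as -1/+1, adaboost_rbf(X, y) returns a list of (alpha, model) pairs that predict_ensemble combines by a weighted vote of the individual RBF networks.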

Similar resources

Comparing Prediction Power of Artificial Neural Networks Compound Models in Predicting Credit Default Swap Prices through Black–Scholes–Merton Model

Default risk is one of the most important types of risks, and credit default swap (CDS) is one of the most effective financial instruments to cover such risks. The lack of these instruments may reduce investment attraction, particularly for international investors, and impose potential losses on the economy of the countries lacking such financial instruments, among them, Iran. After the 2007 fi...

Detecting the intensity of cognitive and physical load using AdaBoost and deep rectifier neural networks

The Interspeech ComParE 2014 Challenge consists of two machine learning tasks, which have quite a small number of examples. Due to our good results in ComParE 2013, we considered AdaBoost a suitable machine learning meta-algorithm for these tasks; besides, we also experimented with Deep Rectifier Neural Networks. These differ from traditional neural networks in that the former have several hidde...

A Parallel Adaboost-Backpropagation Neural Network for Massive Image Dataset Classification

Image classification uses computers to simulate human understanding and cognition of images by automatically categorizing images. This study proposes a faster image classification approach that parallelizes the traditional Adaboost-Backpropagation (BP) neural network using the MapReduce parallel programming model. First, we construct a strong classifier by assembling the outputs of 15 BP neural...

Averaged Conservative Boosting: Introducing a New Method to Build Ensembles of Neural Networks

In this paper, a new algorithm called Averaged Conservative Boosting (ACB) is presented to build ensembles of neural networks. In ACB we mix the improvements that Averaged Boosting (Aveboost) and Conservative Boosting (Conserboost) made to Adaptive Boosting (Adaboost). In the algorithm we propose, we have applied the conservative equation used in Conserboost along with the averaged procedure use...

The Application of an Ensemble of Boosted Elman Networks to Time Series Prediction: A Benchmark Study

In this paper, the application of multiple Elman neural networks to time series data regression problems is studied. An ensemble of Elman networks is formed by boosting to enhance the performance of the individual networks. A modified version of the AdaBoost algorithm is employed to integrate the predictions from multiple networks. Two benchmark time series data sets, i.e., the Sunspot and Box-...

Journal:

Volume   Issue

Pages  -

Publication date: 1999